# Multilingual Code Understanding
## CodeBERTa Small V1 (claudios)
CodeBERTa is a code understanding model based on the RoBERTa architecture, pretrained on multiple programming languages so it can handle code-related tasks efficiently.
Tags: Large Language Model, Transformers, Other · Downloads: 16 · Likes: 1
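Because CodeBERTa is pretrained with masked language modeling over code, a quick way to try it is token prediction on a masked code snippet. The following is a minimal sketch using the Hugging Face Transformers fill-mask pipeline; the model id `huggingface/CodeBERTa-small-v1` is an assumption based on the listing above (the entry shown here points to a claudios mirror), so substitute the exact repository name you intend to use.

```python
# Minimal sketch: masked-token prediction over code with CodeBERTa.
# The model id below is an assumption; replace it with the repository you use.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="huggingface/CodeBERTa-small-v1",
)

# CodeBERTa uses a RoBERTa-style tokenizer, so the mask token is "<mask>".
code = "def hello():\n    <mask> 'Hello, world!'"

# Print the top suggested tokens and their scores for the masked position.
for prediction in fill_mask(code):
    print(prediction["token_str"], round(prediction["score"], 3))
```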
## GraphCodeBERT Base (microsoft)
GraphCodeBERT is a Transformer-based pretrained model designed for programming languages; it combines code token sequences with data-flow information during pretraining.
Tags: Large Language Model · Downloads: 59.23k · Likes: 67
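A common way to use GraphCodeBERT is as an encoder that turns code snippets into embeddings for tasks such as code search or clone detection. The sketch below assumes the model id `microsoft/graphcodebert-base` and omits the data-flow graph inputs used during pretraining, encoding only the token sequence.

```python
# Minimal sketch: embedding a code snippet with GraphCodeBERT's encoder.
# Assumes the model id "microsoft/graphcodebert-base"; data-flow inputs are omitted.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/graphcodebert-base")
model = AutoModel.from_pretrained("microsoft/graphcodebert-base")

code = "def add(a, b):\n    return a + b"
inputs = tokenizer(code, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden states into a single vector per snippet,
# a simple starting point for retrieval-style code understanding tasks.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # torch.Size([1, 768])
```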